Joint entropy
In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.

==Definition==

The joint Shannon entropy of two variables <math>X</math> and <math>Y</math> is defined as

:<math>H(X,Y) = -\sum_{x} \sum_{y} P(x,y) \log_2[P(x,y)]</math>

where <math>x</math> and <math>y</math> are particular values of <math>X</math> and <math>Y</math>, respectively, <math>P(x,y)</math> is the joint probability of these values occurring together, and <math>P(x,y) \log_2[P(x,y)]</math> is defined to be 0 if <math>P(x,y) = 0</math>.

For more than two variables <math>X_1, \ldots, X_n</math> this expands to

:<math>H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]</math>

where <math>x_1, \ldots, x_n</math> are particular values of <math>X_1, \ldots, X_n</math>, respectively, <math>P(x_1, \ldots, x_n)</math> is the probability of these values occurring together, and <math>P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]</math> is defined to be 0 if <math>P(x_1, \ldots, x_n) = 0</math>.
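As a concrete check on the definition, here is a minimal Python sketch that computes <math>H(X,Y)</math> in bits from a table of joint probabilities, using the convention above that zero-probability terms contribute 0. NumPy, the function name joint_entropy, and the example table are illustrative assumptions, not part of the article.

<syntaxhighlight lang="python">
import numpy as np

def joint_entropy(p_xy):
    """Joint Shannon entropy in bits of a joint probability table.

    p_xy: array whose entries are the joint probabilities P(x, y)
    (or P(x_1, ..., x_n)) and sum to 1. Entries equal to 0 contribute 0,
    matching the 0 * log2(0) := 0 convention in the definition.
    """
    p = np.asarray(p_xy, dtype=float)
    nz = p[p > 0]  # drop zero-probability cells so 0 * log2(0) is treated as 0
    return -np.sum(nz * np.log2(nz))

# Example: X and Y independent and uniform on {0, 1}.
# Each of the four outcomes has probability 0.25, so
# H(X, Y) = -4 * 0.25 * log2(0.25) = 2 bits = H(X) + H(Y).
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(joint_entropy(p_xy))  # 2.0
</syntaxhighlight>

The same function handles the more-than-two-variable case unchanged, since the sum runs over every cell of the table regardless of how many dimensions it has.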
Source: Wikipedia, the free encyclopedia (English edition), article "Joint entropy".